Regularized Chained Deep Neural Network Classifier for Multiple Annotators
Authors
Abstract
The increasing popularity of crowdsourcing platforms, e.g., Amazon Mechanical Turk, is changing how datasets for supervised learning are built. In these cases, instead of having datasets labeled by a single source (which is supposed to be an expert who provides the absolute gold standard), databases holding labels from multiple annotators are provided. However, most state-of-the-art methods devoted to learning from multiple experts assume that each labeler's behavior is homogeneous across the input feature space. Besides, independence constraints are imposed on the annotators' outputs. This paper presents a regularized chained deep neural network to deal with classification tasks from multiple annotators. The introduced method, termed RCDNN, jointly predicts the ground truth label and the annotators' performance from input space samples. In turn, RCDNN codes interdependencies among the annotators by analyzing the layers' weights and includes l1, l2, and Monte-Carlo Dropout-based regularizers to deal with the over-fitting issue in deep learning models. Obtained results (using both simulated and real-world annotators) demonstrate that RCDNN can deal with multi-labeler scenarios for classification tasks, defeating state-of-the-art techniques.
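As a rough illustration of the architecture described in the abstract, the sketch below wires a shared backbone to two heads: one predicting the class distribution and one predicting per-annotator, input-dependent reliabilities, with explicit l1/l2 penalties and Monte-Carlo Dropout. Layer sizes, the sigmoid reliability head, and the function names are illustrative assumptions; the paper's exact RCDNN formulation and loss may differ.

```python
# Hypothetical sketch of a chained multi-annotator classifier (PyTorch).
# Architecture details are assumptions, not the authors' exact model.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ChainedAnnotatorNet(nn.Module):
    def __init__(self, in_dim, n_classes, n_annotators, hidden=64, p_drop=0.2):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Dropout(p_drop),            # kept active at test time for MC estimates
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Dropout(p_drop),
        )
        self.class_head = nn.Linear(hidden, n_classes)      # ground-truth label estimate
        self.reliab_head = nn.Linear(hidden, n_annotators)  # input-dependent annotator reliability

    def forward(self, x):
        h = self.backbone(x)
        y_logits = self.class_head(h)             # logits for p(y | x)
        lam = torch.sigmoid(self.reliab_head(h))  # reliability lambda_r(x) in [0, 1]
        return y_logits, lam

def l1_l2_penalty(model, l1=1e-5, l2=1e-4):
    # Explicit l1/l2 regularizers over all weights, as named in the abstract.
    l1_term = sum(p.abs().sum() for p in model.parameters())
    l2_term = sum((p ** 2).sum() for p in model.parameters())
    return l1 * l1_term + l2 * l2_term

def mc_dropout_predict(model, x, n_samples=20):
    # Monte-Carlo Dropout: keep dropout active and average stochastic passes.
    model.train()  # enables dropout at inference time
    with torch.no_grad():
        probs = torch.stack(
            [F.softmax(model(x)[0], dim=-1) for _ in range(n_samples)])
    return probs.mean(0), probs.std(0)  # predictive mean and uncertainty
```

The reliability head is what makes the model "chained" in spirit: both outputs are computed from the same features, so annotator performance can vary across the input space rather than being a fixed per-annotator constant.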
Similar Resources
Recognition of Multiple PQ Issues using Modified EMD and Neural Network Classifier
This paper presents a new framework based on a modified EMD method for the detection of single and multiple PQ issues. In the modified EMD, a DWT stage precedes the traditional EMD process. This scheme improves EMD by eliminating the mode-mixing problem. It is a two-step algorithm: in the first step, the input PQ signal is decomposed into low- and high-frequency components using the DWT. In the second step, the low freq...
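A minimal sketch of the two-step idea described above, assuming the PyWavelets (pywt) and PyEMD (EMD-signal) packages; the wavelet choice ('db4') and single-level decomposition are assumptions, not details from this paper.

```python
# Step 1: DWT splits the power-quality signal into low/high-frequency bands.
# Step 2: EMD runs on each band separately, which is one way to reduce mode mixing.
import numpy as np
import pywt
from PyEMD import EMD  # pip install EMD-signal

def dwt_then_emd(signal, wavelet='db4'):
    low, high = pywt.dwt(signal, wavelet)  # approximation and detail coefficients
    emd = EMD()
    return emd(low), emd(high)             # IMFs of each frequency band

if __name__ == '__main__':
    t = np.linspace(0, 1, 1024)
    # Toy PQ-like signal: fundamental plus a high-frequency disturbance.
    sig = np.sin(2 * np.pi * 50 * t) + 0.3 * np.sin(2 * np.pi * 300 * t)
    imfs_low, imfs_high = dwt_then_emd(sig)
```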
Shakeout: A New Regularized Deep Neural Network Training Scheme
Recent years have witnessed the success of deep neural networks in dealing with a wide range of practical problems. The invention of effective training techniques has contributed largely to this success. The so-called "Dropout" training scheme is one of the most powerful tools for reducing over-fitting. From a statistical point of view, Dropout works by implicitly imposing an L2 regularizer on the weights....
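The implicit-L2 view mentioned above can be made concrete with a standard calculation for a linear model (in the spirit of Wager et al., 2013, "Dropout Training as Adaptive Regularization"); the inverted-dropout scaling below is the usual convention, not a detail from the Shakeout paper. With masks $z_i = b_i/(1-p)$, $b_i \sim \mathrm{Bernoulli}(1-p)$, so that $\mathbb{E}[z_i] = 1$ and $\mathrm{Var}(z_i) = p/(1-p)$,

$$
\mathbb{E}_z\big[(y - (z \odot x)^\top w)^2\big]
= (y - x^\top w)^2 + \frac{p}{1-p}\sum_i x_i^2 w_i^2,
$$

i.e., dropout on a linear model is the squared loss plus a data-dependent ridge (L2) penalty on the weights.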
Regularized sequence-level deep neural network model adaptation
We propose a regularized sequence-level (SEQ) deep neural network (DNN) model adaptation methodology as an extension of the previous KL-divergence regularized cross-entropy (CE) adaptation [1]. In this approach, the negative KL-divergence between the baseline and the adapted model is added to the maximum mutual information (MMI) as regularization in the sequence-level adaptation. We compared ei...
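In symbols, the objective sketched above is the MMI criterion minus a weighted KL term anchoring the adapted model to the baseline; the weight $\rho$ and the frame-level form of the KL term are generic assumptions, not necessarily this paper's exact formulation:

$$
\mathcal{F}(\theta) = \mathcal{F}_{\mathrm{MMI}}(\theta)
- \rho \sum_{t} \mathrm{KL}\!\left( p_{\theta_0}(\cdot \mid o_t) \,\middle\|\, p_{\theta}(\cdot \mid o_t) \right),
$$

where $\theta_0$ is the baseline model, $\theta$ the adapted model, and $o_t$ the acoustic observation at frame $t$. Maximizing $\mathcal{F}$ trades off discriminative fit against drift away from the baseline.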
Manifold regularized deep neural networks
Deep neural networks (DNNs) have been successfully applied to a variety of automatic speech recognition (ASR) tasks, both in discriminative feature extraction and hybrid acoustic modeling scenarios. The development of improved loss functions and regularization approaches has resulted in consistent reductions in ASR word error rates (WERs). This paper presents a manifold learning based regulari...
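A typical manifold regularizer of the kind referenced above penalizes output differences between inputs that are neighbors on the data manifold; the Gaussian graph weights $w_{ij}$ and trade-off coefficient $\gamma$ below are a generic choice, not necessarily this paper's:

$$
\mathcal{L} = \mathcal{L}_{\mathrm{ASR}} + \gamma \sum_{i,j} w_{ij}\, \lVert f(x_i) - f(x_j) \rVert^2,
\qquad
w_{ij} = \exp\!\big(-\lVert x_i - x_j \rVert^2 / 2\sigma^2\big)\ \text{for neighbors, else } 0.
$$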
BitNet: Bit-Regularized Deep Neural Networks
We present a novel regularization scheme for training deep neural networks. The parameters of neural networks are usually unconstrained and have a dynamic range dispersed over the real line. Our key idea is to control the expressive power of the network by dynamically quantizing the range and set of values that the parameters can take. We formulate this idea using a novel end-to-end approach th...
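A minimal sketch of the quantization idea described above: parameters are snapped to a grid of $2^b$ levels over a dynamic range. The straight-through estimator and per-tensor range are common conventions and are assumptions here, not BitNet's exact end-to-end formulation.

```python
import torch

def quantize(w, n_bits, w_min, w_max):
    # Map w onto 2**n_bits evenly spaced levels spanning [w_min, w_max].
    levels = 2 ** n_bits - 1
    scale = (w_max - w_min) / levels
    q = torch.round((w.clamp(w_min, w_max) - w_min) / scale) * scale + w_min
    # Straight-through estimator: forward pass uses the quantized values,
    # backward pass routes gradients to the full-precision weights.
    return w + (q - w).detach()

# Example: constrain a weight tensor to 4 bits over a fixed range.
w = torch.randn(8, requires_grad=True)
w_q = quantize(w, n_bits=4, w_min=-1.0, w_max=1.0)
```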
Journal
Journal Title: Applied Sciences
Year: 2021
ISSN: 2076-3417
DOI: https://doi.org/10.3390/app11125409